Regularization for Neural Networks
Abstract
Research into regularization techniques is motivated by the tendency of neural networks to learn the specifics of the dataset they were trained on rather than general features that apply to unseen data. This is known as overfitting. The goal of any supervised machine learning task is to approximate a function that maps inputs to outputs, given a dataset of examples and labels. An important assumption is that the training dataset is representative of the true distribution of the data and of the target function to be approximated. However, almost all data is noisy and contains random deviations from the true distribution. Given this fact, approximating the function represented by the training data too precisely is undesirable: the algorithm will learn the noise in the training dataset and is unlikely to perform well on unseen data. Regularizing neural networks helps them learn the true function and ignore the noise.
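A minimal sketch of one common regularization technique, L2 weight decay, in PyTorch. The architecture, synthetic data, and hyperparameters below are illustrative assumptions, not details taken from the abstract above.

```python
# Illustrative sketch: L2 regularization (weight decay) on a small MLP
# fit to noisy data. All settings here are arbitrary assumptions.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Noisy training data: y = sin(x) plus Gaussian noise.
x = torch.linspace(-3, 3, 50).unsqueeze(1)
y = torch.sin(x) + 0.1 * torch.randn_like(x)

model = nn.Sequential(nn.Linear(1, 64), nn.Tanh(), nn.Linear(64, 1))

# weight_decay adds an L2 penalty on the weights to the loss,
# discouraging large weights that fit the noise rather than the signal.
optimizer = torch.optim.SGD(model.parameters(), lr=0.05, weight_decay=1e-3)
loss_fn = nn.MSELoss()

for step in range(2000):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()
```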
Similar Resources
Regularization Parameter Selection for Faulty Neural Networks
Regularization techniques have attracted much research in the past decades. Most work focuses on designing the regularization term, and little on selecting the optimal regularization parameter, especially for faulty neural networks. In the real world, node faults inevitably occur, which leads to many faulty network patterns. If the conventional method is employed, i.e...
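The abstract contrasts the paper's approach with "the conventional method" of choosing a regularization parameter. A minimal sketch of that conventional baseline, assuming a simple ridge model and a held-out validation set; this is not the paper's own procedure.

```python
# Illustrative sketch: selecting the regularization parameter lambda
# by grid search on a validation set, with ridge regression as a
# stand-in model. Data and lambda grid are arbitrary assumptions.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
w_true = rng.normal(size=10)
y = X @ w_true + 0.5 * rng.normal(size=200)

X_tr, y_tr = X[:150], y[:150]
X_va, y_va = X[150:], y[150:]

def ridge_fit(X, y, lam):
    d = X.shape[1]
    # Closed-form ridge solution: w = (X^T X + lam I)^{-1} X^T y
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

best_lam, best_err = None, np.inf
for lam in [1e-3, 1e-2, 1e-1, 1.0, 10.0]:
    w = ridge_fit(X_tr, y_tr, lam)
    err = np.mean((X_va @ w - y_va) ** 2)
    if err < best_err:
        best_lam, best_err = lam, err
print(f"selected lambda = {best_lam}, validation MSE = {best_err:.4f}")
```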
Predictive Abilities of Bayesian Regularization and Levenberg–Marquardt Algorithms in Artificial Neural Networks: A Comparative Empirical Study on Social Data
The objective of this study is to compare the predictive ability of Bayesian regularization with that of Levenberg–Marquardt artificial neural networks. To find the best architecture, the model was tested with one-, two-, three-, four-, and five-neuron architectures. MATLAB (2011a) was used for analyzing the Bayesian regularization and Levenberg–Marquardt learning al...
Optimized Combination, Regularization, and Pruning in Parallel Consensual Neural Networks
Optimized combination, regularization, and pruning are proposed for Parallel Consensual Neural Networks (PCNNs), a neural network architecture based on the consensus of a collection of stage neural networks trained on the same input data with different representations. Here, a regularization scheme is presented for the PCNN, and a regularized cost function is minimized during training. Th...
Forecasting of heavy metals concentration in groundwater resources of Asadabad plain using artificial neural network approach
Nowadays, 90% of Iran's water demand is met by groundwater resources, so forecasting pollutant concentrations in these resources is vital. This research therefore aimed to develop and employ feedforward artificial neural networks (ANNs) to forecast arsenic (As), lead (Pb), and zinc (Zn) concentrations in the groundwater resources of the Asadabad plain. In this research, the ANN models we...
A constrained regularization approach for input-driven recurrent neural networks
We introduce a novel regularization approach for a class of input-driven recurrent neural networks. The regularization of network parameters is constrained to reimplement a previously recorded state trajectory. We derive a closed-form solution for network regularization and show that the method is capable of reimplementing harvested dynamics. We investigate important properties of the method and...
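A hedged sketch of the general idea described in this abstract: given a recorded state trajectory of an input-driven network, solve a ridge-regularized least-squares problem in closed form so the fitted recurrent weights reproduce (reimplement) the harvested dynamics. This is a generic ridge-style analogue under placeholder data; the paper's exact constrained formulation may differ.

```python
# Illustrative sketch: closed-form, regularized fit of recurrent
# weights W that map each recorded state to the next one.
import numpy as np

rng = np.random.default_rng(0)
n, T = 20, 300

# Recorded state trajectory (random placeholder standing in for
# states harvested from an actual input-driven network).
S = rng.normal(size=(n, T))

S_prev, S_next = S[:, :-1], S[:, 1:]
lam = 1e-2  # regularization strength

# Closed form: W = S_next S_prev^T (S_prev S_prev^T + lam I)^{-1}
W = S_next @ S_prev.T @ np.linalg.inv(S_prev @ S_prev.T + lam * np.eye(n))

# W now reproduces the recorded one-step dynamics in a
# ridge-regularized least-squares sense.
err = np.mean((W @ S_prev - S_next) ** 2)
print(f"one-step reconstruction MSE: {err:.4f}")
```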
A Comparative Study on Regularization Strategies for Embedding-based Neural Networks
This paper aims to compare different regularization strategies for addressing a common phenomenon, severe overfitting, in embedding-based neural networks for NLP. We chose two widely studied neural models and tasks as our testbed. We tried several frequently applied or newly proposed regularization strategies, including penalizing weights (embeddings excluded), penalizing embeddings, re-embedding wo...
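A minimal PyTorch sketch of one of the strategies named above: an L2 penalty on the network weights with the embedding table excluded. The model shape, pooling, and hyperparameters are illustrative assumptions, not details from the study.

```python
# Illustrative sketch: "penalizing weights (embeddings excluded)" via
# two optimizer parameter groups with different weight_decay values.
import torch
import torch.nn as nn

vocab, dim, hidden, classes = 1000, 50, 64, 2

embedding = nn.Embedding(vocab, dim)
classifier = nn.Sequential(nn.Linear(dim, hidden), nn.ReLU(),
                           nn.Linear(hidden, classes))

# L2 penalty on classifier weights only; embeddings are exempt.
optimizer = torch.optim.Adam([
    {"params": classifier.parameters(), "weight_decay": 1e-4},
    {"params": embedding.parameters(), "weight_decay": 0.0},
], lr=1e-3)

# One training step on random placeholder data.
tokens = torch.randint(0, vocab, (8, 12))   # batch of token ids
feats = embedding(tokens).mean(dim=1)       # mean-pooled embeddings
loss = nn.functional.cross_entropy(classifier(feats),
                                   torch.randint(0, classes, (8,)))
loss.backward()
optimizer.step()
```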
Publication date: 2016